Should a Systematic Review Be Required in a Clinical Trial Report? Perhaps, But Not Yet


It has been argued that only research supported by systematic reviews or a robustly demonstrated research gap should be funded and that scientific papers reporting on randomized clinical trials (RCTs) should reference prior systematic reviews. Jia et al’s1 metaresearch study evaluated RCTs that were published at least 2 years after an initial Cochrane review was released and were included in subsequent revisions of that review. Across 737 Cochrane reviews and 4003 RCTs, 31.0% of RCTs cited prior Cochrane reviews, 42.4% cited non-Cochrane reviews, and 56.6% cited either or both types of review. Stratified analyses revealed differences across clinical specialties and a higher likelihood of systematic reviews being cited by RCTs that had 100 or more participants, were not funded by industry, or were reported by authors from developed countries. If the core issue were the mere citation of a systematic review or meta-analysis in a trial report, the study indeed points to a marked deficiency. The issues, however, may be more complex.

Mass Production of Systematic Reviews and Meta-analyses

In a 2016 study2 that is still relevant today, Ioannidis reported that from 1986 to 2015, PubMed tagged 266 782 entries as systematic reviews and 58 611 entries as meta-analyses. Between 1991 and 2014, yearly publications indexed in PubMed increased by 153% but surged by 2728% for systematic reviews and 2635% for meta-analyses. Of even greater relevance was the classification of meta-analyses (and, presumably, associated systematic reviews) by quality and relevance. In addition to speculating that 20% of meta-analyses went unpublished, Ioannidis classified 27% as “redundant and unnecessary,” 17% as “decent but not useful,” 13% as “misleading” and pertaining to “abandoned genetics,” and 20% as “flawed beyond repair.” That left 3% of meta-analyses judged to be “decent and clinically useful.”

Misnomer of the Systematic Review Tag

Having an article tagged as a systematic review in PubMed does not mean that the study reports a systematic review comprehensively in terms of search, extraction, and classification and synthesis of evidence.2 Word count limits imposed by journals make it difficult to achieve the necessary transparency in methods and results for a systematic review and a meta-analysis. Combined papers often give short shrift to the systematic review section, limiting it to superficial counts and occasionally referring to additional materials in the supplement.

Systematic Reviews With and Without Meta-analyses

In 2014, there were 28 959 systematic reviews (76%) and 9135 meta-analyses (24%) indexed in PubMed.2 The gap between these proportions suggests that many and perhaps most systematic reviews do not lead to meta-analyses, which demand more time to conduct. A more encouraging statistic3 is that of 682 systematic reviews indexed in Medline in February 2014, 63% included a meta-analysis. Regardless, some opportunistic productivity may be at play.

Threats to Quality and Validity

As part of a 3-article journal debate on the usefulness of systematic reviews, Møller and colleagues4 took the unsure position between a fervent yes and an emphatic no. They acknowledged the relevance of systematic reviews and meta-analyses to clinical guidance, statistical power, precision of effect estimates, and, in particular, detection of differences in adverse events. They also expressed strong concern about quality owing to “risk of bias, indirectness, imprecision, inconsistency, and publication bias.” Even more compelling was their concern about the validity of systematic reviews and meta-analyses, including a “lack of systematic and transparent conduct and reporting, poor methodological quality of the included studies, risk of random errors, unrecognized and unaccounted statistical and clinical heterogeneity, data dredging in nonpredefined statistical analyses, and lack of assessment of the overall quality of evidence.”

Citation Snowballing and Spiking

Meta-analyses, more so than systematic reviews, tend to be cited frequently, whether somewhat deferentially in an article’s first paragraph or more substantively elsewhere in a paper. A snowball outcome is not unlikely, in which a meta-analysis continues to receive citations because authors see it cited frequently and assume this to be a mark of quality (or include it preemptively in case 1 of the authors is to serve as a reviewer). Authors of systematic reviews and meta-analyses welcome this, given that it boosts their online search and other metrics. Editors and publishers do, too, given that frequent citation is associated with increased impact factors, among other indexes.

Retrospective and Incremental vs Innovation and Breakthrough

Requiring every clinical trial report to include a systematic review, let alone a meta-analysis, may impair innovation and its communication, especially for breakthroughs. Systematic reviews and meta-analyses are retrospective appraisals of past evidence, often with a considerable time lag. Requiring their inclusion may frame a trial report as incremental and imply that science is incremental, which it is mainly but not completely. Should the phase 3 trial report5 of first-in-class imatinib, a breakthrough in the treatment of chronic myelogenous leukemia, have included a systematic review of prior treatments? Hematologists knew the limitations of interferon in efficacy and safety. Patients with chronic myelogenous leukemia treated with interferon experienced severe side effects and a limited increase in life expectancy. A perhaps less extreme example: what if a systematic review is available but does not add substantively to a report, apart from a perfunctory, nondescript, check-off-the-box citation in a typical first paragraph? The pivotal trial report of tebentafusp, a breakthrough treatment for metastatic uveal melanoma, was a case in point.6

Conservatism of Science

Being intrinsically a retrospective approach, systematic reviews summarize and consolidate what is known, not what remains unknown. Indirectly, they contribute to the inherent conservatism of science. Grant proposals with what reviewers may consider far-out ideas may not get funded, and articles with these ideas may not be accepted. Those at the edge may get the “need more preliminary data” sanction. Worse, beyond-the-horizon ideas may get ridiculed, their presentations challenged, conference abstracts and papers spurned, and grant proposals rejected because they go against the grain, as Marshall and Warren experienced with their Nobel Prize–winning work on Helicobacter pylori.7

The major contribution of the Jia et al1 study is that it reasserts, with data, the need for careful, thorough, and transparent reporting of the state of the science in a particular area of inquiry. Including, not just deferentially citing, retrospective syntheses of evidence is indicated for areas where progress is incremental, where significant prior evidence has accumulated, and when doing so exposes the benefit of a new trial. This is true on 1 condition, however: the evidence synthesis should meet the Ioannidis2 criterion of “decent and clinically useful.” That may leave a few systematic reviews and meta-analyses, covering a few areas.

Furthermore, despite extensive guidelines for conducting systematic reviews, they remain interpretive methods. When used to support the publication of a trial report, reviews may be vulnerable to selection bias by exclusively supporting the scope and objectives of a study rather than balancing them with opposing syntheses of evidence. Priority should be given to systematic reviews combined with a meta-analysis, even if they need to be published separately. If several systematic reviews and meta-analyses have been published, a rationale should be given why 1 merited selection over others.

Should a systematic review be required in a clinical trial report? Jia and colleagues1 offer a progressive idea. Nevertheless, in the absence of authoritative criteria of quality, relevance, and transparency, a directive would be premature. Considering the extremes, from validation studies to breakthrough innovations, from the known to unknown, guidance should prevail over mandate.

Article Information

Published: March 23, 2023. doi:10.1001/jamanetworkopen.2023.4226

Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2023 Abraham I et al. JAMA Network Open.

Corresponding Author: Ivo Abraham, PhD, Center for Health Outcomes and Pharmacoeconomic Research, R. Ken Coit College of Pharmacy, University of Arizona, 1295 N Martin, Tucson, AZ 85721 ([email protected]).

Conflict of Interest Disclosures: Dr Abraham reported holding equity in Matrix45, LLC outside the submitted work and serving as quantitative methods editor for JAMA Dermatology (compensated) and editor-in-chief for the Journal of Medical Economics (noncompensated but with an annual allotment of waivers of publication charges). Dr Calamia reported receiving personal fees from the National Health Care Institute and Matrix45 outside the submitted work. Dr MacDonald reported being employed by and holding equity in Matrix45, LLC outside the submitted work.

Additional Information: Drs Abraham and MacDonald have published several systematic reviews and meta-analyses.

References

1. Jia Y, Li B, Yang Z, et al. Trends of randomized clinical trials citing prior systematic reviews, 2007-2021. JAMA Netw Open. 2023;6(3):e234219. doi:10.1001/jamanetworkopen.2023.4219

2. Ioannidis JPA. The mass production of redundant, misleading, and conflicted systematic reviews and meta-analyses. Milbank Q. 2016;94(3):485-514. doi:10.1111/1468-0009.12210

3. Page MJ, Shamseer L, Altman DG, et al. Epidemiology and reporting characteristics of systematic reviews of biomedical research: a cross-sectional study. PLoS Med. 2016;13(5):e1002028. doi:10.1371/journal.pmed.1002028

4. Møller MH, Ioannidis JPA, Darmon M. Are systematic reviews and meta-analyses still useful research? We are not sure. Intensive Care Med. 2018;44(4):518-520. doi:10.1007/s00134-017-5039-y

5. O’Brien SG, Guilhot F, Larson RA, et al; IRIS Investigators. Imatinib compared with interferon and low-dose cytarabine for newly diagnosed chronic-phase chronic myeloid leukemia. N Engl J Med. 2003;348(11):994-1004. doi:10.1056/NEJMoa022457

6. Nathan P, Hassel JC, Rutkowski P, et al; IMCgp100-202 Investigators. Overall survival benefit with tebentafusp in metastatic uveal melanoma. N Engl J Med. 2021;385(13):1196-1206. doi:10.1056/NEJMoa2103485

7. Pincock S. Nobel Prize winners Robin Warren and Barry Marshall. Lancet. 2005;366(9495):1429. doi:10.1016/S0140-6736(05)67587-3
